Sum-of-squares hierarchies for binary polynomial optimization

Authors

Abstract

We consider the sum-of-squares hierarchy of approximations for the problem of minimizing a polynomial $$f$$ over the boolean hypercube $${\mathbb {B}}^{n}=\{0,1\}^n$$. This hierarchy provides, for each integer $$r \in {\mathbb {N}}$$, a lower bound $$f_{({r})}$$ on the minimum $$f_{\min }$$ of $$f$$, given by the largest scalar $$\lambda $$ for which $$f - \lambda $$ is a sum of squares on $${\mathbb {B}}^{n}$$ with degree at most 2r. We analyze the quality of these bounds by estimating the worst-case error $$f_{\min } - f_{({r})}$$ in terms of the least roots of the Krawtchouk polynomials. As a consequence, for fixed $$t \in [0, 1/2]$$, we can show that this worst-case error in the regime $$r \approx t \cdot n$$ is of the order $$1/2 - \sqrt{t(1-t)}$$ as n tends to $$\infty $$. Our proof combines classical Fourier analysis on $${\mathbb {B}}^{n}$$ with the polynomial kernel technique and existing results on the extremal roots of Krawtchouk polynomials. This link to the roots of orthogonal polynomials relies on a connection between the hierarchy of lower bounds $$f_{({r})}$$ and another hierarchy of upper bounds $$f^{({r})}$$, for which we are also able to establish the same error analysis. Our analysis extends to the minimization of a polynomial over the q-ary cube $$({\mathbb {Z}}/ q{\mathbb {Z}})^{n}$$. Furthermore, our results apply to the setting of matrix-valued polynomials.
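The error analysis hinges on the least roots of the Krawtchouk polynomials $$K_r(x; n)$$. As a small numerical sketch (ours, not taken from the paper; the function name and parameter choices are illustrative), the snippet below finds the least root via a standard device: the roots of an orthogonal polynomial are the eigenvalues of the truncated Jacobi (tridiagonal) matrix of its three-term recurrence. For the binary Krawtchouk polynomials, that recurrence is $$x K_k = -\tfrac{k+1}{2} K_{k+1} + \tfrac{n}{2} K_k - \tfrac{n-k+1}{2} K_{k-1}$$.

```python
import numpy as np

def krawtchouk_least_root(n, r):
    """Least root of the degree-r binary Krawtchouk polynomial K_r(x; n),
    computed as the smallest eigenvalue of the r x r Jacobi matrix of the
    three-term recurrence (a numerically stable alternative to root-finding
    on the monomial coefficients)."""
    k = np.arange(r - 1)
    # Off-diagonal entries sqrt(a_k * c_{k+1}) with a_k = -(k+1)/2
    # and c_k = -(n-k+1)/2 from the recurrence above.
    off = np.sqrt((k + 1) * (n - k)) / 2.0
    J = np.diag(np.full(r, n / 2.0)) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.eigvalsh(J).min()

# Sanity check: K_2(x; 4) = ((4 - 2x)^2 - 4)/2 has roots {1, 3}.
assert abs(krawtchouk_least_root(4, 2) - 1.0) < 1e-9

# In the regime r ≈ t*n, the scaled least root xi/n tends to
# 1/2 - sqrt(t(1-t)) as n grows; here t = 0.2, so the limit is 0.1.
n, r = 200, 40
print(krawtchouk_least_root(n, r) / n)
```

For finite n the scaled least root sits somewhat above the limit value; the convergence as n grows is the content of the asymptotic statement in the abstract.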



Similar articles

Tight Sum-Of-Squares Lower Bounds for Binary Polynomial Optimization Problems

We give two results concerning the power of the Sum-of-Squares (SoS)/Lasserre hierarchy. For binary polynomial optimization problems of degree 2d and an odd number of variables n, we prove that (n+2d−1)/2 levels of the SoS/Lasserre hierarchy are necessary to provide the exact optimal value. This matches the recent upper bound result by Sakaue, Takeda, Kim and Ito. Additionally, we study a conjectu...


Sum of Squares and Polynomial Convexity

The notion of sos-convexity has recently been proposed as a tractable sufficient condition for convexity of polynomials based on a sum of squares decomposition. A multivariate polynomial p(x) = p(x1, ..., xn) is said to be sos-convex if its Hessian H(x) can be factored as H(x) = M(x)^T M(x) with a possibly nonsquare polynomial matrix M(x). It turns out that one can reduce the problem of decidi...
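To make the definition concrete, here is a small hand-constructed example (ours, not from the abstract above): the polynomial p(x1, x2) = (x1^4 + 6 x1^2 x2^2 + x2^4)/12 has Hessian H(x) = M(x)^T M(x) with M(x) = [[x1, x2], [x2, x1]], so it is sos-convex. The snippet checks the factorization, and hence positive semidefiniteness of H, numerically at random points.

```python
import numpy as np

def hessian(x1, x2):
    """Hessian of p(x1, x2) = (x1^4 + 6 x1^2 x2^2 + x2^4) / 12."""
    return np.array([[x1**2 + x2**2, 2 * x1 * x2],
                     [2 * x1 * x2, x1**2 + x2**2]])

def M(x1, x2):
    """Polynomial matrix with H(x) = M(x)^T M(x), certifying sos-convexity."""
    return np.array([[x1, x2],
                     [x2, x1]])

rng = np.random.default_rng(0)
for _ in range(100):
    x1, x2 = rng.normal(size=2)
    H, Mx = hessian(x1, x2), M(x1, x2)
    assert np.allclose(H, Mx.T @ Mx)             # the factorization holds
    assert np.linalg.eigvalsh(H).min() >= -1e-9  # so H(x) is PSD at every sample
print("sos-convexity factorization verified at 100 random points")
```

In general such an M(x) is found by semidefinite programming rather than by hand; the point here is only to illustrate what the certificate H = M^T M looks like.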


Sum of Squares Programs and Polynomial Inequalities

How can one find real solutions (x1, x2)? How to prove that they do not exist? And if the solution set is nonempty, how to optimize a polynomial function over this set? Until a few years ago, the default answer to these and similar questions would have been that the possible nonconvexity of the feasible set and/or objective function precludes any kind of analytic global results. Even today, t...


Relaxed Stabilization Conditions via Sum of Squares Approach for the Nonlinear Polynomial Model

In this paper, stabilization conditions and controller design for a class of nonlinear systems are proposed. The proposed method is based on nonlinear feedback, a quadratic Lyapunov function and heuristic slack matrix definitions. These slack matrices in null products are derived using the properties of the system dynamics. Based on the Lyapunov stability theorem and sum of squares (SOS) dec...


Regularization Methods for Sum of Squares Relaxations in Large Scale Polynomial Optimization

We study how to solve sum of squares (SOS) and Lasserre’s relaxations for large scale polynomial optimization. When interior-point type methods are used, typically only small or moderately large problems can be solved. This paper proposes regularization-type methods that can solve significantly larger problems. We first describe these methods for general conic semidefinite optimization...



Journal

Journal: Mathematical Programming

Year: 2022

ISSN: 0025-5610, 1436-4646

DOI: https://doi.org/10.1007/s10107-021-01745-9